Parametric Bayesian Estimation of Differential Entropy and Relative Entropy

Authors

  • Maya R. Gupta
  • Santosh Srivastava
Abstract

Given iid samples drawn from a distribution with known parametric form, we propose the minimization of expected Bregman divergence to form Bayesian estimates of differential entropy and relative entropy, and derive such estimators for the uniform, Gaussian, Wishart, and inverse Wishart distributions. Additionally, formulas are given for a log gamma Bregman divergence and the differential entropy and relative entropy for the Wishart and inverse Wishart. The results, as always with Bayesian estimates, depend on the accuracy of the prior parameters, but example simulations show that the performance can be substantially improved compared to maximum likelihood or state-of-the-art nonparametric estimators.
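
As a concrete illustration of the approach, the sketch below compares a maximum likelihood plug-in estimate of Gaussian differential entropy with a Bayesian posterior-mean estimate. This is a minimal sketch, not the paper's derivation: it assumes squared-error loss (itself a Bregman divergence, for which the Bayes estimate is the posterior mean), a zero-mean Gaussian so that an inverse Wishart prior on the covariance is conjugate on its own, and illustrative prior parameters Psi0 and nu0; the function names are hypothetical.

```python
import numpy as np
from scipy.special import digamma

def multivariate_digamma(a, d):
    # psi_d(a) = sum_{i=1}^{d} psi(a + (1 - i) / 2)
    return sum(digamma(a + (1.0 - i) / 2.0) for i in range(1, d + 1))

def gaussian_entropy_ml(X):
    # Plug-in ML estimate: h = (d/2) ln(2 pi e) + (1/2) ln|Sigma_ML|,
    # with Sigma_ML = (1/n) X^T X the zero-mean ML covariance.
    n, d = X.shape
    _, logdet = np.linalg.slogdet(X.T @ X / n)
    return 0.5 * d * np.log(2.0 * np.pi * np.e) + 0.5 * logdet

def gaussian_entropy_bayes(X, Psi0, nu0):
    # Posterior-mean estimate under Sigma ~ InvWishart(Psi0, nu0).
    # For a zero-mean Gaussian the posterior is InvWishart(Psi0 + X^T X, nu0 + n),
    # and E[ln|Sigma|] = ln|Psi_n| - d ln 2 - psi_d(nu_n / 2).
    n, d = X.shape
    Psi_n = Psi0 + X.T @ X
    nu_n = nu0 + n
    _, logdet = np.linalg.slogdet(Psi_n)
    e_logdet = logdet - d * np.log(2.0) - multivariate_digamma(nu_n / 2.0, d)
    return 0.5 * d * np.log(2.0 * np.pi * np.e) + 0.5 * e_logdet

# Small-sample comparison against the entropy of the true sampling distribution.
rng = np.random.default_rng(0)
d, n = 5, 12
X = rng.multivariate_normal(np.zeros(d), np.eye(d), size=n)
print("true:", 0.5 * d * np.log(2.0 * np.pi * np.e))  # |Sigma| = 1
print("ML plug-in:", gaussian_entropy_ml(X))
print("Bayes posterior mean:", gaussian_entropy_bayes(X, np.eye(d), nu0=d + 2))
```

When n is small relative to d, the ML log-determinant is biased low, which is where an informative prior can help; how much it helps depends on how well Psi0 and nu0 reflect the true covariance, mirroring the abstract's caveat about prior accuracy.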


Similar Articles

E-Bayesian Approach in A Shrinkage Estimation of Parameter of Inverse Rayleigh Distribution under General Entropy Loss Function

Whenever approximate and initial information about the unknown parameter of a distribution is available, the shrinkage estimation method can be used to estimate it. In this paper, first the $E$-Bayesian estimation of the parameter of the inverse Rayleigh distribution under the general entropy loss function is obtained. Then, the shrinkage estimate of the inverse Rayleigh distribution parameter i...


Some properties of the parametric relative operator entropy

The notion of entropy was introduced by Clausius in 1850, and some of the main steps towards the consolidation of the concept were taken by Boltzmann and Gibbs. Since then several extensions and reformulations have been developed in various disciplines with motivations and applications in different subjects, such as statistical mechanics, information theory, and dynamical systems. Fujii and Kam...


Bayesian entropy estimation for binary spike train data using parametric prior knowledge

Shannon’s entropy is a basic quantity in information theory, and a fundamental building block for the analysis of neural codes. Estimating the entropy of a discrete distribution from samples is an important and difficult problem that has received considerable attention in statistics and theoretical neuroscience. However, neural responses have characteristic statistical structure that generic en...


Maximum Probability and Relative Entropy Maximization. Bayesian Maximum Probability and Empirical Likelihood

Works, briefly surveyed here, are concerned with two basic methods: Maximum Probability and Bayesian Maximum Probability; as well as with their asymptotic instances: Relative Entropy Maximization and Maximum Non-parametric Likelihood. Parametric and empirical extensions of the latter methods – Empirical Maximum Maximum Entropy and Empirical Likelihood – are also mentioned. The methods are viewe...




Journal:
  • Entropy

Volume 12, Issue -

Pages -

Published 2010